# Japanese text embedding
## PLaMo-Embedding-1B
License: Apache-2.0
PLaMo-Embedding-1B is a Japanese text embedding model developed by Preferred Networks, with outstanding results on Japanese text embedding benchmarks.
Tags: Text Embedding, Transformers, Japanese
Publisher: pfnet · Downloads: 33.48k · Likes: 25
## Sarashina-Embedding-v1-1B
A text embedding model built on a 1.2-billion-parameter Japanese large language model, with strong results on the JMTEB benchmark.
Tags: Text Embedding, Transformers, Multilingual
Publisher: sbintuitions · Downloads: 23.85k · Likes: 31
## sentence-bert-base-ja-mean-tokens-v2
A Japanese-specific Sentence-BERT model trained with an improved loss function over version 1, yielding a 1.5 to 2 percentage point gain in accuracy.
Tags: Text Embedding, Japanese
Publisher: sonoisa · Downloads: 108.15k · Likes: 42
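All three models above map text to fixed-size vectors that are typically compared with cosine similarity (for example, to rank documents against a query). A minimal, dependency-free sketch of that comparison; the vectors here are illustrative placeholders, not real model outputs:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Placeholder 3-dimensional vectors standing in for model outputs
# (real embeddings from these models have hundreds of dimensions).
query_vec = [0.2, 0.8, 0.1]
doc_vec = [0.25, 0.75, 0.05]
print(cosine_similarity(query_vec, doc_vec))  # close to 1.0: the vectors point in similar directions
```

In practice the vectors would come from the model itself, e.g. loading `pfnet/plamo-embedding-1b` with the `transformers` library and encoding sentences before comparing them.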